W12. Intelligence 01

We'll explore how neural networks learn to recognize patterns by building functions from data. Starting from the challenge of digit recognition, we'll construct networks neuron by neuron, discovering why layers create hierarchical representations and how non-linearity enables networks to approximate any function. We'll train neural networks ourselves using TensorFlow Playground, where you'll see how gradient descent navigates high-dimensional parameter spaces to minimize error. We'll connect these ideas to course themes of emergence and distributed cognition, examining how simple computations combine to produce complex behavior.


Pre-readings and Videos

The videos today explain the basics of neural networks.

Analog Perceptron

A beautiful demonstration of a perceptron using an analog system.

Original Neuron Model Paper

McCulloch and Pitts' original research paper that defined the mathematical model of a neuron.

Original Perceptron Engineering Paper

Rosenblatt's original research paper that described his perceptron machine.

Neural Networks

The beginning video of a series on deep learning and neural networks.

Gradient Descent

Gradient descent is one of the core mechanisms underlying backpropagation.
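
The idea the video covers can be sketched in a few lines. This is a hypothetical example (not from the video itself): gradient descent minimizes a function by repeatedly stepping in the direction opposite the slope, here for the simple one-parameter function f(w) = (w - 3)².

```python
# A minimal sketch of gradient descent on f(w) = (w - 3)^2,
# whose minimum is at w = 3.

def gradient(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

w = 0.0               # starting guess
learning_rate = 0.1   # step size

for _ in range(100):  # each step moves w "downhill"
    w -= learning_rate * gradient(w)

print(round(w, 4))    # converges toward the minimum at w = 3
```

Real neural networks apply the same update rule, just over millions of parameters at once, with backpropagation supplying the gradients.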


Summary of the Day


Learning Goals

  1. Understand neural networks as function approximators that learn mappings from training data.
  2. Understand the basic perceptron as introduced by McCulloch and Pitts and originally implemented by Rosenblatt.
  3. Design a single perceptron's weights to achieve a specific classification task.
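
As a taste of the third goal, here is a small illustrative sketch (not from the course materials) of hand-designing a single perceptron's weights so it computes logical AND: each input gets a weight of 1, and a bias of -1.5 ensures the neuron fires only when both inputs are on.

```python
# A single McCulloch-Pitts-style perceptron: fire (output 1) when the
# weighted sum of inputs plus bias exceeds zero.

def perceptron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total > 0 else 0

# Hand-chosen weights for AND: the sum crosses zero only when x1 = x2 = 1.
and_weights = [1.0, 1.0]
and_bias = -1.5

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", perceptron([x1, x2], and_weights, and_bias))
```

Trying other weight and bias choices (for example, a bias of -0.5 to get OR) is a good warm-up for the in-class exercise.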